# Protein sequence prediction
## ProLLaMA Stage 1
GreatCaptainNemo · Apache-2.0 · Protein Model · Transformers · 650 downloads · 2 likes

ProLLaMA is a protein large language model based on the Llama-2-7b architecture, specializing in multitask protein language processing.
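Because ProLLaMA is distributed as a Llama-style causal language model, it can in principle be loaded through the standard Transformers text-generation API. The sketch below is illustrative only: the checkpoint id `GreatCaptainNemo/ProLLaMA_Stage_1` and the `Seq=<` prompt are assumptions, and the exact instruction format is defined on the model card.

```python
# Minimal sketch: sampling a protein sequence from ProLLaMA with Transformers.
# Assumed checkpoint id and prompt format -- consult the ProLLaMA model card
# for the instruction template actually used during training.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "GreatCaptainNemo/ProLLaMA_Stage_1"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Illustrative prompt only: Stage 1 is pretrained on protein sequences,
# commonly written in a "Seq=<...>" style.
prompt = "Seq=<"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=True, top_p=0.95)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```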
## ESM-2 3B (esm2_t36_3B_UR50D)
facebook · MIT · Protein Model · Transformers · 3.5M downloads · 22 likes

ESM-2 is a next-generation protein model trained with masked language modeling objectives, suitable for fine-tuning on various downstream tasks with protein sequences as input.
## ESM-2 150M (esm2_t30_150M_UR50D)
facebook · MIT · Protein Model · Transformers · 69.91k downloads · 7 likes

ESM-2 is a state-of-the-art protein model trained with a masked language modeling objective, suitable for fine-tuning on a variety of tasks that take protein sequences as input.
## ESM-2 35M (esm2_t12_35M_UR50D)
facebook · MIT · Protein Model · Transformers · 332.83k downloads · 15 likes

ESM-2 is a cutting-edge protein model trained with a masked language modeling objective, suitable for a variety of protein sequence analysis tasks.
## ESM-2 8M (esm2_t6_8M_UR50D)
facebook · MIT · Protein Model · Transformers · 1.5M downloads · 21 likes

ESM-2 is a next-generation protein model trained with masked language modeling objectives, suitable for fine-tuning on various protein sequence tasks.
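All four ESM-2 checkpoints above expose the same Transformers masked language modeling interface and differ only in size, so switching between them is a one-line change of the checkpoint id. A minimal sketch using the smallest checkpoint (assuming the Hugging Face id `facebook/esm2_t6_8M_UR50D` for the 8M entry above; the example sequence is arbitrary):

```python
# Minimal sketch: masked-residue prediction with ESM-2 via Transformers.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "facebook/esm2_t6_8M_UR50D"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# ESM-2 tokenizes raw amino-acid strings directly (no spaces between residues).
sequence = "MKTAYIAKQR<mask>DTVIIGLGALG"  # arbitrary example sequence
inputs = tokenizer(sequence, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Report the most likely residue at the masked position.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```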
## ProtBert (prot_bert)
Rostlab · Protein Model · Transformers · 276.10k downloads · 111 likes

ProtBert is a protein sequence pre-training model based on the BERT architecture that captures biophysical properties of protein sequences through self-supervised learning.
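Unlike ESM-2, ProtBert expects its input as space-separated single-letter residues, with the rare amino acids U, Z, O, and B mapped to X. A minimal embedding-extraction sketch, assuming the Hugging Face checkpoint id `Rostlab/prot_bert`:

```python
# Minimal sketch: extracting per-residue embeddings from ProtBert.
import re
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("Rostlab/prot_bert", do_lower_case=False)
model = BertModel.from_pretrained("Rostlab/prot_bert")

# ProtBert expects space-separated residues; map rare amino acids (U, Z, O, B) to X.
sequence = "M K T A Y I A K Q R Q I S F V K"  # arbitrary example sequence
sequence = re.sub(r"[UZOB]", "X", sequence)

inputs = tokenizer(sequence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One embedding vector per token, including the [CLS] and [SEP] positions.
embeddings = outputs.last_hidden_state
print(embeddings.shape)
```

The resulting per-residue embeddings are the usual starting point for fine-tuning or feature-based downstream tasks on protein sequences.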